
Collaborating Authors

Tzen and Raginsky


Expressiveness Remarks for Denoising Diffusion Models and Samplers

Vargas, Francisco, Reu, Teodora, Kerekes, Anna

arXiv.org Artificial Intelligence

Denoising diffusion models are a class of generative models that have recently achieved state-of-the-art results across many domains. Noise is gradually added to the data by a diffusion process, which transforms the data distribution into a Gaussian. Samples from the generative model are then obtained by simulating an approximation of the time reversal of this diffusion, initialized from Gaussian samples. Recent research has explored adapting diffusion models to sampling and inference tasks. In this paper, we leverage known connections to stochastic control, akin to the Föllmer drift, to extend established neural network approximation results for the Föllmer drift to denoising diffusion models and samplers.
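The noising/time-reversal mechanism the abstract describes can be sketched in a toy setting. This is an illustrative example of our own, not the paper's implementation: the data distribution is a 1-D Gaussian, so the score of every noised marginal is available in closed form and stands in for the neural network a real diffusion model would train.

```python
import numpy as np

# Toy 1-D denoising diffusion (illustrative only; values below are arbitrary).
# Forward (noising) SDE: dX = -X dt + sqrt(2) dW, whose stationary law is N(0, 1).
mu0, s0 = 2.0, 0.5                 # data distribution N(mu0, s0^2)
T, n_steps, n_paths = 3.0, 300, 20_000
dt = T / n_steps
rng = np.random.default_rng(0)

def marginal(t):
    """Mean and variance of the noised marginal p_t (a Gaussian stays Gaussian)."""
    m = mu0 * np.exp(-t)
    v = s0**2 * np.exp(-2 * t) + 1.0 - np.exp(-2 * t)
    return m, v

def score(x, t):
    """Exact score grad_x log p_t(x); a trained network would replace this."""
    m, v = marginal(t)
    return -(x - m) / v

# Start the generative pass from the (near-Gaussian) marginal at time T ...
m_T, v_T = marginal(T)
x = m_T + np.sqrt(v_T) * rng.standard_normal(n_paths)

# ... and simulate the time reversal dX = [-X - 2 * score] dt + sqrt(2) dW_bar
# backward from t = T to t = 0 with Euler-Maruyama steps.
for k in range(n_steps):
    t = T - k * dt
    drift = -x - 2.0 * score(x, t)
    x = x - drift * dt + np.sqrt(2 * dt) * rng.standard_normal(n_paths)

print(x.mean(), x.std())  # should land close to mu0 = 2.0 and s0 = 0.5
```

The reverse drift `f - g^2 * score` is the standard time-reversal formula for a forward SDE `dX = f dt + g dW`; the approximation results discussed in the paper concern how well a network can represent such drifts.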


Bayesian Learning via Neural Schrödinger-Föllmer Flows

Vargas, Francisco, Ovsianas, Andrius, Fernandes, David, Girolami, Mark, Lawrence, Neil D., Nüsken, Nikolas

arXiv.org Machine Learning

In this work we explore a new framework for approximate Bayesian inference on large datasets based on stochastic control. We advocate stochastic control as a finite-time, low-variance alternative to popular steady-state methods such as stochastic gradient Langevin dynamics (SGLD). Furthermore, we discuss and adapt the existing theoretical guarantees of this framework and establish connections to existing variational inference (VI) routines for SDE-based models.
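For reference, the steady-state baseline the abstract contrasts against can be sketched as follows. This is a minimal SGLD loop on a toy conjugate-Gaussian model of our own choosing (not the paper's experiments), where the exact posterior is known and serves as a sanity check.

```python
import numpy as np

# Minimal SGLD sketch (illustrative toy model, not the paper's setup):
# data ~ N(theta, 1) with a N(0, 10) prior, so the Gaussian posterior is exact.
rng = np.random.default_rng(1)
n, batch = 1000, 50
theta_true = 1.5
data = theta_true + rng.standard_normal(n)

def grad_log_post(theta, minibatch):
    """Unbiased stochastic gradient of the log posterior from a minibatch."""
    grad_prior = -theta / 10.0                           # N(0, 10) prior
    grad_lik = (n / batch) * np.sum(minibatch - theta)   # rescaled likelihood term
    return grad_prior + grad_lik

eps = 1e-3                          # fixed small step size
theta, samples = 0.0, []
for it in range(5000):
    mb = rng.choice(data, size=batch, replace=False)
    # Langevin update: half-step gradient drift plus injected Gaussian noise.
    theta += 0.5 * eps * grad_log_post(theta, mb) + np.sqrt(eps) * rng.standard_normal()
    if it >= 1000:                  # discard burn-in before collecting samples
        samples.append(theta)

post_mean = data.sum() / (n + 0.1)  # exact conjugate posterior mean
print(np.mean(samples), post_mean)
```

Note that SGLD only targets the posterior as its stationary distribution (and with a fixed step size carries discretization bias), which is the "steady-state" limitation that motivates the finite-time stochastic control formulation.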